Predict human behavior


What is quantum cognition? Physics theory could predict human behavior.

#artificialintelligence

The same fundamental framework that allows Schrödinger's cat to be both alive and dead, and that lets two particles "speak to each other" even across a galaxy's distance, could help explain perhaps the most mysterious phenomenon of all: human behavior. Quantum physics and human psychology may seem completely unrelated, but some scientists think the two fields overlap in interesting ways. Both disciplines attempt to predict how unruly systems might behave in the future. The difference is that one field aims to understand the fundamental nature of physical particles, while the other attempts to explain human nature -- along with its inherent fallacies. "Cognitive scientists found that there are many 'irrational' human behaviors," Xiaochu Zhang, a biophysicist and neuroscientist at the University of Science and Technology of China in Hefei, told Live Science in an email.


AI Learns to Predict Human Behavior from Videos

#artificialintelligence

New York, NY--June 28, 2021--Predicting what someone is about to do next based on their body language comes naturally to humans, but not so for computers. When we meet another person, they might greet us with a hello, a handshake, or even a fist bump. We may not know which gesture will be used, but we can read the situation and respond appropriately. In a new study, Columbia Engineering researchers unveil a computer vision technique that gives machines a more intuitive sense of what will happen next by leveraging higher-level associations between people, animals, and objects. "Our algorithm is a step toward machines being able to make better predictions about human behavior, and thus better coordinate their actions with ours," said Carl Vondrick, assistant professor of computer science at Columbia, who directed the study, which was presented at the Conference on Computer Vision and Pattern Recognition (CVPR) on June 24, 2021. "Our results open a number of possibilities for human-robot collaboration, autonomous vehicles, and assistive technology."


Autonomous Cars Can Predict How Selfish Your Driving Is

#artificialintelligence

Self-driving cars could soon be able to classify you as a selfish or an altruistic driver. While this might bruise some egos, researchers from MIT CSAIL claim that it will make autonomous vehicles (AVs) much safer when driving alongside humans. Predicting how humans might behave, and adjusting an algorithm's reasoning based on how selfish or selfless that behavior might be, could dramatically reduce accidents between AI-enabled vehicles and humans. Properly integrating AI technology with the complicated and nuanced world of human behavior is a huge barrier to overcome, especially in applications where the outcome can mean the difference between life and death. Beyond making self-driving cars safe enough for our streets, teaching AI to comprehend the less quantifiable parts of life could let it assist humans in roles it previously could not handle, and could advance AI applications in general.


Teaching machines to predict human behaviors

#artificialintelligence

I've written a few times recently about various projects that are helping develop robots capable of learning by watching how a task is performed. The machines typically observe something like a YouTube video in their attempt to pick up the skill, but the tasks have generally been quite manual in nature, such as the correct handling of kitchen utensils. A recent study from researchers at MIT set out to test whether machines could also use a similar method for picking up something so intuitively human. The researchers were hoping to train robots to be able to instinctively predict how an encounter with a human might unfold. It's the kind of intuition that we develop subconsciously as a result of our lifetime of experience.


MIT researchers use TV to train computers to predict human behavior

#artificialintelligence

There's a lot that artificial intelligence can do, but understanding human behavior isn't one of its strong suits. A team at MIT's Computer Science and Artificial Intelligence Laboratory wants to change that. Researchers essentially turned computers into couch potatoes by feeding them hundreds of hours of footage from popular TV shows like "The Office," "Scrubs," and "Desperate Housewives," NPR reported Tuesday. Each clip ends with one of four actions: a hug, a kiss, a high five, or a handshake. The computer's job is to predict which one is about to happen.


Algorithm Binge Watches TV To Predict Human Behavior

#artificialintelligence

Watching TV can be a very educational experience for a computer. In a paper that will be presented this week at the Conference on Computer Vision and Pattern Recognition (CVPR), researchers at MIT's Computer Science and Artificial Intelligence Lab (CSAIL) created an algorithm that can predict how humans will behave in certain situations. The algorithm 'watched' 600 hours of TV shows culled from clips posted on YouTube, including The Office, The Big Bang Theory, and Desperate Housewives. The purpose was to see if it could accurately predict what humans would do during an interaction -- would they shake hands, hug, kiss, or high-five? After feeding it this background material, the researchers showed the algorithm new clips, froze each one just before an action was about to happen, and asked it to predict what happened next. The algorithm's guesses fell short of humans, who were able to correctly predict what would happen 71 percent of the time.
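
The setup described above amounts to a four-way classification problem: given features extracted from a frozen clip, score each candidate action and pick the most likely one. The sketch below is a hypothetical, drastically simplified illustration of that task -- a linear softmax classifier over made-up frame features -- and not the CSAIL model itself, which learned its visual representations from raw video with deep neural networks.

```python
import math
import random

# The four actions each training clip ended with.
ACTIONS = ["hug", "kiss", "high five", "handshake"]

def softmax(scores):
    """Turn raw per-action scores into probabilities that sum to 1."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def predict_action(clip_features, weights, biases):
    """Score a frozen clip's feature vector against each action class
    and return the most probable next action with its probability."""
    scores = [
        sum(w * x for w, x in zip(row, clip_features)) + b
        for row, b in zip(weights, biases)
    ]
    probs = softmax(scores)
    best = max(range(len(ACTIONS)), key=lambda i: probs[i])
    return ACTIONS[best], probs[best]

# Toy demo: 8 made-up "features" for a frozen frame. The real system
# learned its features end to end from 600 hours of video.
random.seed(0)
weights = [[random.gauss(0, 1) for _ in range(8)] for _ in range(4)]
biases = [0.0] * 4
clip = [random.gauss(0, 1) for _ in range(8)]
action, prob = predict_action(clip, weights, biases)
print(f"predicted next action: {action} (p={prob:.2f})")
```

Freezing a clip before the action and asking for a label is exactly the evaluation protocol the article describes; everything inside the classifier here (feature size, weights, the linear model) is invented for illustration.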